Learning from Noisy Labels with Complementary Loss Functions

Authors

Abstract

Recent studies reveal that deep neural networks are sensitive to label noise, which leads to poor generalization performance in some tasks. Although different robust loss functions have been proposed to remedy this issue, they suffer from an underfitting problem and are thus not sufficient to learn accurate models. On the other hand, the commonly used Cross Entropy (CE) loss, which performs well in standard supervised learning (with clean supervision), is not robust to label noise. In this paper, we propose a general framework for learning with complementary loss functions. In our framework, CE and a robust loss play complementary roles in a joint objective, as per their sufficiency and robustness properties respectively. Specifically, we find that by exploiting the memorization effect of deep networks, we can easily filter out a proportion of hard samples and generate reliable pseudo labels for the easy samples, which reduces the noise to quite a low level. Then, we simply train with CE on the pseudo supervision and with the robust loss on the original noisy supervision. In this procedure, the CE loss guarantees sufficient optimization while the robust loss can be regarded as a supplement. Experimental results on benchmark classification datasets indicate that our method helps achieve robust and sufficient network training simultaneously.
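As a concrete illustration of this joint objective, the sketch below combines CE on model-generated pseudo labels for easy (small-loss) samples with a robust loss on the original noisy labels. It is a minimal sketch, not the authors' released implementation: the small-loss threshold tau, the weighting alpha, and the choice of MAE as the robust loss are all assumptions made here for illustration.

import torch
import torch.nn.functional as F

def complementary_loss(logits, noisy_labels, num_classes, tau=0.5, alpha=1.0):
    """Sketch of a CE + robust-loss joint objective on noisy labels.

    Easy samples (small CE loss under the current model) receive pseudo
    labels from the model's own predictions and are trained with CE; all
    samples are additionally trained with a robust loss (MAE here, an
    illustrative choice) on the original noisy labels.
    """
    probs = F.softmax(logits, dim=1)

    # Memorization effect: samples the network already fits (small loss)
    # are likely clean/easy, so their confident predictions make
    # reliable pseudo labels.
    per_sample_ce = F.cross_entropy(logits, noisy_labels, reduction="none")
    easy_mask = per_sample_ce < tau
    pseudo_labels = probs.argmax(dim=1)

    # CE on pseudo supervision drives sufficient optimization.
    if easy_mask.any():
        ce_term = F.cross_entropy(logits[easy_mask], pseudo_labels[easy_mask])
    else:
        ce_term = logits.new_zeros(())

    # Robust loss (MAE) on the original noisy supervision as a supplement.
    one_hot = F.one_hot(noisy_labels, num_classes).float()
    mae_term = (probs - one_hot).abs().sum(dim=1).mean()

    return ce_term + alpha * mae_term

In practice the threshold would likely be scheduled over training, since the memorization effect means clean samples are fit early while noisy ones are memorized only later.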


Similar articles

Learning from Complementary Labels

Collecting labeled data is costly and thus is a critical bottleneck in real-world classification tasks. To mitigate the problem, we consider a complementary label, which specifies a class that a pattern does not belong to. Collecting complementary labels would be less laborious than ordinary labels since users do not have to carefully choose the correct class from many candidate classes. Howeve...
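To make the setting concrete: a complementary label marks one class the pattern does not belong to, so a naive surrogate simply pushes down the predicted probability of that class. The loss below is only an illustrative baseline, not the unbiased risk estimator this paper derives.

import torch
import torch.nn.functional as F

def naive_complementary_ce(logits, comp_labels):
    """Illustrative loss for complementary labels: maximize the total
    probability assigned to all classes EXCEPT the complementary one,
    i.e. minimize -log(1 - p[comp_label]). Not the paper's unbiased
    estimator; just a minimal, concrete surrogate."""
    probs = F.softmax(logits, dim=1)
    p_comp = probs.gather(1, comp_labels.unsqueeze(1)).squeeze(1)
    return -torch.log1p(-p_comp.clamp(max=1 - 1e-6)).mean()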


Learning with Noisy Labels

In this paper, we theoretically study the problem of binary classification in the presence of random classification noise — the learner, instead of seeing the true labels, sees labels that have independently been flipped with some small probability. Moreover, random label noise is class-conditional — the flip probability depends on the class. We provide two approaches to suitably modify any giv...
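One of the two approaches, the method of unbiased estimators, corrects any base loss so that its expectation under the noise distribution equals the loss on the true label. A sketch for binary labels y in {-1, +1}, assuming the class-conditional flip rates rho_pos and rho_neg are known (function and variable names are mine):

import math

def unbiased_loss(base_loss, t, y, rho_pos, rho_neg):
    """Unbiased-estimator correction for class-conditional noise.

    rho_pos = P(label flipped | true y = +1), rho_neg likewise for -1.
    For any base loss l, the corrected loss l_tilde satisfies
    E_noise[l_tilde(t, noisy_y)] = l(t, true_y).
    """
    rho_y = rho_pos if y == +1 else rho_neg        # flip rate of class y
    rho_neg_y = rho_neg if y == +1 else rho_pos    # flip rate of class -y
    return ((1 - rho_neg_y) * base_loss(t, y) - rho_y * base_loss(t, -y)) \
        / (1 - rho_pos - rho_neg)

# Example with the logistic loss as the base loss:
logistic = lambda t, y: math.log(1 + math.exp(-y * t))
print(unbiased_loss(logistic, t=0.8, y=+1, rho_pos=0.2, rho_neg=0.1))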


Learning with Biased Complementary Labels

In this paper we study the classification problem in which we have access to an easily obtainable surrogate for the true labels, namely complementary labels, which specify classes that observations do not belong to. For example, if one is familiar with monkeys but not meerkats, a meerkat is easily identified as not a monkey, so "monkey" is annotated to the meerkat as a complementary label. Specifical...
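The bias in such labels can be captured by a transition matrix Q with Q[i, j] = P(complementary label = j | true label = i). The sketch below uses a forward-style correction through Q, a standard recipe in this literature rather than necessarily the paper's exact procedure:

import torch
import torch.nn.functional as F

def biased_complementary_loss(logits, comp_labels, Q):
    """Forward-style correction for biased complementary labels.

    Q is a (C, C) row-stochastic transition matrix with zero diagonal.
    Mapping the model's class posterior through Q yields the predicted
    distribution over complementary labels, which we fit with NLL.
    """
    probs = F.softmax(logits, dim=1)          # p(y | x), shape (B, C)
    comp_probs = probs @ Q                    # p(y_bar | x), shape (B, C)
    nll = -torch.log(comp_probs.gather(1, comp_labels.unsqueeze(1)) + 1e-12)
    return nll.mean()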


Interactive Learning from Multiple Noisy Labels

Interactive learning is a process in which a machine learning algorithm is provided with meaningful, well-chosen examples as opposed to randomly chosen examples typical in standard supervised learning. In this paper, we propose a new method for interactive learning from multiple noisy labels where we exploit the disagreement among annotators to quantify the easiness (or meaningfulness) of an ex...
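One simple way to turn annotator disagreement into an easiness score, sketched here as an assumption rather than the paper's exact measure, is one minus the normalized entropy of the annotators' empirical label distribution:

import math
from collections import Counter

def easiness(annotator_labels):
    """Quantify how 'easy' an example is from multiple noisy labels:
    1 minus the normalized entropy of the annotators' label votes.
    Unanimous annotators -> 1.0; maximal disagreement -> 0.0."""
    counts = Counter(annotator_labels)
    n = len(annotator_labels)
    if len(counts) == 1:
        return 1.0
    entropy = -sum((c / n) * math.log(c / n) for c in counts.values())
    return 1.0 - entropy / math.log(len(counts))

print(easiness(["cat", "cat", "cat"]))    # 1.0: unanimous, easy
print(easiness(["cat", "dog", "bird"]))   # 0.0: full disagreement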


Deep Learning from Noisy Image Labels with Quality Embedding

There is an emerging trend to leverage noisy image datasets in many visual recognition tasks. However, label noise among the datasets severely degrades the performance of deep learning approaches. Recently, one mainstream approach has been to introduce a latent label to handle label noise, which has shown promising improvements in network designs. Nevertheless, the mismatch between latent labels a...



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2021

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v35i11.17213